Fundamental Tensor Operations for Large-Scale Data Analysis in Tensor Train Formats
Authors
Abstract
We discuss extended definitions of linear and multilinear operations such as Kronecker, Hadamard, and contracted products, and establish links between them for tensor calculus. Then we introduce effective low-rank tensor approximation techniques including CANDECOMP/PARAFAC (CP), Tucker, and tensor train (TT) decompositions with a number of mathematical and graphical representations. We also provide a brief review of mathematical properties of the TT decomposition as a low-rank approximation technique. With the aim of breaking the curse of dimensionality in large-scale numerical analysis, we describe basic operations on large-scale vectors, matrices, and high-order tensors represented by TT decomposition. The proposed representations can be used for describing numerical methods based on TT decomposition for solving large-scale optimization problems such as systems of linear equations and symmetric eigenvalue problems.
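As a quick illustration of the kind of links the abstract refers to, the following numpy snippet (a minimal sketch, not code from the paper; the vec() helper and the random test matrices are our own) checks two standard identities connecting the Kronecker product with matrix multiplication and vectorization, and recovers the Hadamard product as a sub-block of the Kronecker product:

```python
# Numerical check of standard Kronecker/Hadamard/vec identities (illustration only):
#   (A ⊗ B)(C ⊗ D) = (AC) ⊗ (BD)        -- mixed-product property
#   vec(A X B)      = (B^T ⊗ A) vec(X)   -- Kronecker/vec identity (column-major vec)
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 4)), rng.standard_normal((5, 6))
C, D = rng.standard_normal((4, 2)), rng.standard_normal((6, 3))
X = rng.standard_normal((4, 6))

vec = lambda M: M.reshape(-1, order="F")          # column-major vectorization

# Mixed-product property
assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))

# Kronecker/vec identity
assert np.allclose(vec(A @ X @ B), np.kron(B.T, A) @ vec(X))

# Hadamard product as a principal sub-matrix of the Kronecker product (square case)
E, F = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
idx = [i * 3 + i for i in range(3)]               # entry (i, j) of E*F sits at (i*3+i, j*3+j) in E⊗F
assert np.allclose(E * F, np.kron(E, F)[np.ix_(idx, idx)])
print("all identities verified")
```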
Similar articles
Tensor Networks for Big Data Analytics and Large-Scale Optimization Problems
Tensor decompositions and tensor networks are emerging and promising tools for data analysis and data mining. In this paper we review basic and emerging models and associated algorithms for large-scale tensor networks, especially Tensor Train (TT) decompositions, using novel mathematical and graphical representations. We discuss the concept of tensorization (i.e., creating very high-order tensors...
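To make the idea of tensorization concrete, here is a small, self-contained numpy sketch (our illustration, not the reviewed algorithms): a length-2^10 sample of a smooth function is reshaped into an order-10 tensor and then compressed by a plain TT-SVD, i.e., successive truncated SVDs of unfoldings; the small TT ranks it reports are what makes the format useful for very high-order tensors.

```python
import numpy as np

# Tensorize: sample a smooth function on 2**10 points and view it as an order-10 tensor.
d, n = 10, 2
t = np.linspace(0.0, 1.0, n**d)
x = np.exp(-t) * np.sin(4 * np.pi * t)
T = x.reshape([n] * d)

def tt_svd(tensor, eps=1e-10):
    """Plain TT-SVD via successive truncated SVDs; returns cores of shape (r_prev, n_k, r_k)."""
    dims = tensor.shape
    cores, r_prev = [], 1
    C = tensor.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))      # rank after truncation
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        C = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

cores = tt_svd(T)
print("TT ranks:", [G.shape[0] for G in cores[1:]])

# Contract the cores back together and check the approximation error.
full = cores[0]
for G in cores[1:]:
    full = np.tensordot(full, G, axes=([-1], [0]))
print("relative error:", np.linalg.norm(full.reshape(T.shape) - T) / np.linalg.norm(T))
```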
A simple form of MT impedance tensor analysis to simplify its decomposition to remove the effects of near surface small-scale 3-D conductivity structures
Magnetotelluric (MT) is a natural electromagnetic (EM) technique which is used for geothermal, petroleum, geotechnical, groundwater and mineral exploration. MT is also routinely used for mapping of deep subsurface structures. In this method, the measured regional complex impedance tensor (Z) is substantially distorted by any topographical feature or small-scale near-surface, three-dimensional (...
Principal Component Analysis with Tensor Train Subspace
Tensor train is a hierarchical tensor network structure that helps alleviate the curse of dimensionality by parameterizing large-scale multidimensional data via a network of low-rank tensors. Associated with such a construction is the notion of a Tensor Train subspace, and in this paper we propose a TTPCA algorithm for estimating this structured subspace from the given data. By maintaining lo...
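As a toy stand-in for such a structured subspace (not the TTPCA algorithm proposed in that paper), the sketch below uses a Kronecker-product basis U1 ⊗ U2, the simplest tensor-product analogue: the basis is never formed explicitly, and projections are computed from the small factors alone.

```python
# Structured ("tensor-product") subspace illustration in numpy; all sizes are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
n1, n2, r1, r2 = 20, 30, 3, 4                     # ambient sizes and factor ranks

# Orthonormal factors (columns orthonormal), so U1 ⊗ U2 also has orthonormal columns.
U1 = np.linalg.qr(rng.standard_normal((n1, r1)))[0]
U2 = np.linalg.qr(rng.standard_normal((n2, r2)))[0]

# Project x onto the column space of U1 ⊗ U2 without ever forming the 600 x 12 basis.
x = rng.standard_normal(n1 * n2)
X = x.reshape(n1, n2)                             # matricize the vector
coeff = U1.T @ X @ U2                             # r1 x r2 coefficient block
x_proj = (U1 @ coeff @ U2.T).reshape(-1)

# Check against the explicitly formed Kronecker basis.
U = np.kron(U1, U2)
assert np.allclose(x_proj, U @ U.T @ x)
print("factor storage:", U1.size + U2.size, "entries vs explicit basis:", U.size)
```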
Subspace Methods with Local Refinements for Eigenvalue Computation Using Low-Rank Tensor-Train Format
Computing a few eigenpairs of large-scale symmetric eigenvalue problems is far beyond the reach of classical eigensolvers when the eigenvectors cannot even be stored explicitly. We consider a tractable case in which both the coefficient matrix and its eigenvectors can be represented in the low-rank tensor train format. We propose a subspace optimization method combined ...
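A small numerical illustration (not that paper's method) of why eigenvectors can admit low-rank tensor structure: for a Kronecker-sum matrix A ⊗ I + I ⊗ B, every eigenvector is a Kronecker product of eigenvectors of A and B, i.e., an exact rank-1 tensor-train vector.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
A = rng.standard_normal((n, n)); A = A + A.T      # symmetric factors
B = rng.standard_normal((n, n)); B = B + B.T

H = np.kron(A, np.eye(n)) + np.kron(np.eye(n), B) # 64 x 64 Kronecker-sum operator

la, Va = np.linalg.eigh(A)                        # eigenvalues sorted ascending
lb, Vb = np.linalg.eigh(B)
lh, Vh = np.linalg.eigh(H)

# The smallest eigenvalue of H is the sum of the smallest eigenvalues of A and B,
# and the corresponding eigenvector is the Kronecker product of the two factor eigenvectors.
assert np.isclose(lh[0], la[0] + lb[0])
v = np.kron(Va[:, 0], Vb[:, 0])
assert np.allclose(H @ v, lh[0] * v)
print("smallest eigenvalue:", lh[0], "= ", la[0], "+", lb[0])
```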
Tensor Networks for Latent Variable Analysis. Part I: Algorithms for Tensor Train Decomposition
Decompositions of tensors into factor matrices, which interact through a core tensor, have found numerous applications in signal processing and machine learning. A more general tensor model which represents data as an ordered network of sub-tensors of order-2 or order-3 has, so far, not been widely considered in these fields, although this so-called tensor network decomposition has been long st...
Journal: CoRR
Volume: abs/1405.7786
Publication date: 2014